A theory is proposed for designing decision support systems (DSS) so that the confidence a decision maker has in a decision made using the aid equals the quality of that decision. The DSS design theory for user calibration prescribes the properties a DSS needs for its users to achieve perfect calibration. The relevant calibration, decision-making, and DSS literatures are synthesized, and related behavioral theories are drawn on to identify expressiveness, visibility, and inquirability as the requisite properties of the DSS design theory for user calibration.
This paper provides an overview of the first joint curriculum development effort for undergraduate programs in information systems. The curriculum recommendations are a collaborative effort of the following organizations: ACM, AIS, DPMA, and ICIS. After a summary of the objectives and rationale for the curriculum, the curriculum model is described. Input and output attributes of graduates are delineated. Resource requirements for effective IS programs are then identified. Lastly, there is a proposal for maintaining currency of the curriculum through electronic media.
The optimal amount of information needed in a given decision-making situation lies somewhere along a continuum from "not enough" to "too much". Ackoff proposed that information systems often hinder the decision-making process by creating information overload. To deal with this problem, he called for systems that could filter and condense data so that only relevant information reached the decision maker. The potential for information overload is especially critical in text-based information. The purpose of this research is to investigate the effects and theoretical limitations of extract condensing as a text processing tool in terms of recipient performance. In the experiment described here, an environment is created in which the effects of text condensing are isolated from the effects of message and individual recipient differences. The data show no difference in reading comprehension performance between the condensed forms and the original document. This indicates that condensed forms can be produced that are as informative as the original document. These results suggest that it is possible to apply a relatively simple computer algorithm to text and produce extracts that capture enough of the information contained in the original document for the recipient to perform as if he or she had read the original. These results also identify a methodology for assessing the effectiveness of text condensing schemes. The research presented here contributes to a small but growing body of work on text-based information systems and, specifically, text condensing.
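The abstract does not describe the specific algorithm used, but a "relatively simple" extract condenser can be sketched in the spirit of early frequency-based summarization: score each sentence by the corpus frequency of its words and keep the top-scoring sentences in their original order. The function name, scoring rule, and parameters below are illustrative assumptions, not the method from the study.

```python
import re
from collections import Counter


def condense(text, keep=2):
    """Produce an extract of `text` by keeping the `keep` highest-scoring
    sentences. A sentence's score is the summed document-wide frequency of
    its words; selected sentences are emitted in their original order so
    the extract reads coherently.
    """
    # Split on sentence-ending punctuation followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Document-wide word frequencies, case-folded.
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        return sum(freq[w] for w in re.findall(r'[a-z]+', sentence.lower()))

    top = sorted(sentences, key=score, reverse=True)[:keep]
    return ' '.join(s for s in sentences if s in top)
```

For example, `condense("Cats purr. Cats sleep a lot. Dogs bark loudly. Cats and dogs play.", keep=2)` retains the two sentences whose words recur most across the document, which is the intuition behind extract-based (as opposed to abstractive) condensing.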
The article discusses the Plains Cotton Cooperative Association's (PCCA) use of the computer-based trading system TELCOT. TELCOT, a computer-based system developed by PCCA, provides cotton traders with functions much like those available to New York Stock Exchange traders. The author states that about half of the U.S. cotton crop is grown in Texas and Oklahoma, and this cotton amounts to around 10 percent of the world's annual crop. The author explains that in most areas of the U.S., producers collectively market their cotton in pools, in which the grower signs a contract and the title to his or her cotton passes to the pool. When the cotton is purchased, the grower receives the average price for that specific type of cotton. This means that the grower assumes the market risk. Article topics also include the TELCOT Automated Counter Offer program and the development of TELCOT as a strategic advantage.
The concept of a collaborative human-computer interchange was proposed almost thirty years ago. The goal of this paradigm is to design human-computer decision-making systems that think and process information at a level exceeding that of either the human or the computer alone. Technological and conceptual developments have made this holistic partnership increasingly possible. Moreover, recent discussions of human-computer collaborative work have highlighted the system performance advantages of this interchange. In this paper, the notion of human-computer interchange protocols is developed and the importance of these protocols to human-computer collaboration and system performance is argued. Based on data collected in a laboratory setting, empirical support for the proposed holistic effect of human-computer interchange protocols on system performance is provided. Decision performance is significantly improved by interchange protocols that encourage human-computer interaction during the problem-solving process.